
Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Oct 15, 2025

Bumps databricks-sdk from 0.12.0 to 0.68.0.
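In practice this bump is a one-line change to the dependency pin. Assuming a `requirements.txt`-style pin (the actual dependency file in this repository is not shown in this conversation):

```diff
-databricks-sdk==0.12.0
+databricks-sdk==0.68.0
```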

Release notes

Sourced from databricks-sdk's releases.

v0.68.0

Release v0.68.0

New Features and Improvements

  • Add native support for authentication through Azure DevOps OIDC.

Bug Fixes

  • Fix a security issue that resulted in bearer tokens being logged in exception messages.

API Changes

  • Add databricks.sdk.service.dataquality package.
  • Add w.data_quality workspace-level service.
  • Add create_update() and get_update() methods for w.apps workspace-level service.
  • Add compute_size field for databricks.sdk.service.apps.App.
  • Add genie_space field for databricks.sdk.service.apps.AppResource.
  • Add skip_validation field for databricks.sdk.service.catalog.AccountsCreateStorageCredential.
  • Add skip_validation field for databricks.sdk.service.catalog.AccountsUpdateStorageCredential.
  • Add aliases, browse_only, created_at, created_by, full_name, metastore_id, owner, updated_at and updated_by fields for databricks.sdk.service.catalog.CreateRegisteredModelRequest.
  • Add catalog_name, id, model_name and schema_name fields for databricks.sdk.service.catalog.RegisteredModelAlias.
  • Add aliases, catalog_name, created_at, created_by, id, metastore_id, model_name, model_version_dependencies, run_id, run_workspace_id, schema_name, source, status, storage_location, updated_at and updated_by fields for databricks.sdk.service.catalog.UpdateModelVersionRequest.
  • Add aliases, browse_only, catalog_name, created_at, created_by, metastore_id, name, schema_name, storage_location, updated_at and updated_by fields for databricks.sdk.service.catalog.UpdateRegisteredModelRequest.
  • Add key_region field for databricks.sdk.service.provisioning.CreateAwsKeyInfo.
  • Add role_arn field for databricks.sdk.service.provisioning.CreateStorageConfigurationRequest.
  • Add azure_key_info field for databricks.sdk.service.provisioning.CustomerManagedKey.
  • [Breaking] Add customer_facing_private_access_settings field for databricks.sdk.service.provisioning.ReplacePrivateAccessSettingsRequest.
  • Add role_arn field for databricks.sdk.service.provisioning.StorageConfiguration.
  • [Breaking] Add customer_facing_workspace field for databricks.sdk.service.provisioning.UpdateWorkspaceRequest.
  • Add update_mask field for databricks.sdk.service.provisioning.UpdateWorkspaceRequest.
  • Add compute_mode, network, network_connectivity_config_id and storage_mode fields for databricks.sdk.service.provisioning.Workspace.
  • Add enable_serverless_compute field for databricks.sdk.service.sql.GetWorkspaceWarehouseConfigResponse.
  • Add page_size and page_token fields for databricks.sdk.service.sql.ListWarehousesRequest.
  • Add next_page_token field for databricks.sdk.service.sql.ListWarehousesResponse.
  • Add enable_serverless_compute field for databricks.sdk.service.sql.SetWorkspaceWarehouseConfigRequest.
  • Add model_version_status_unknown enum value for databricks.sdk.service.catalog.ModelVersionInfoStatus.
  • Add k8s_active_pod_quota_exceeded and cloud_account_pod_quota_exceeded enum values for databricks.sdk.service.compute.TerminationReasonCode.
  • Add internal_catalog_asset_creation_ongoing_exception, internal_catalog_asset_creation_failed_exception and internal_catalog_asset_creation_unsupported_exception enum values for databricks.sdk.service.dashboards.MessageErrorType.
  • Add ssh_bootstrap_failure, aws_inaccessible_kms_key_failure, init_container_not_finished, spark_image_download_throttled, spark_image_not_found, cluster_operation_throttled, cluster_operation_timeout, serverless_long_running_terminated, azure_packed_deployment_partial_failure, invalid_worker_image_failure, workspace_update, invalid_aws_parameter, driver_out_of_disk, driver_out_of_memory, driver_launch_timeout, driver_unexpected_failure, unexpected_pod_recreation, gcp_inaccessible_kms_key_failure, gcp_kms_key_permission_denied, driver_eviction, user_initiated_vm_termination, gcp_iam_timeout, aws_resource_quota_exceeded, cloud_account_setup_failure, aws_invalid_key_pair, driver_pod_creation_failure, maintenance_mode, internal_capacity_failure, executor_pod_unscheduled, storage_download_failure_slow, storage_download_failure_throttled, dynamic_spark_conf_size_exceeded, aws_instance_profile_update_failure, instance_pool_not_found, instance_pool_max_capacity_reached, aws_invalid_kms_key_state, gcp_insufficient_capacity, gcp_api_rate_quota_exceeded, gcp_resource_quota_exceeded, gcp_ip_space_exhausted, gcp_service_account_access_denied, gcp_service_account_not_found, gcp_forbidden, gcp_not_found, resource_usage_blocked, data_access_config_changed, access_token_failure, invalid_instance_placement_protocol, budget_policy_resolution_failure, in_penalty_box, disaster_recovery_replication, bootstrap_timeout_due_to_misconfig, instance_unreachable_due_to_misconfig, storage_download_failure_due_to_misconfig, control_plane_request_failure_due_to_misconfig, cloud_provider_launch_failure_due_to_misconfig, gcp_subnet_not_ready, cloud_operation_cancelled, cloud_provider_instance_not_launched, gcp_trusted_image_projects_violated, budget_policy_limit_enforcement_activated, eos_spark_image, no_matched_k8s, lazy_allocation_timeout, driver_node_unreachable, secret_creation_failure, pod_scheduling_failure, pod_assignment_failure, allocation_timeout, 
allocation_timeout_no_unallocated_clusters, allocation_timeout_no_matched_clusters, allocation_timeout_no_ready_clusters, allocation_timeout_no_warmed_up_clusters, allocation_timeout_node_daemon_not_ready, allocation_timeout_no_healthy_clusters, netvisor_setup_timeout, no_matched_k8s_testing_tag, cloud_provider_resource_stockout_due_to_misconfig, gke_based_cluster_termination, allocation_timeout_no_healthy_and_warmed_up_clusters, docker_invalid_os_exception, docker_container_creation_exception, docker_image_too_large_for_instance_exception, dns_resolution_error, gcp_denied_by_org_policy, secret_permission_denied, network_check_nic_failure, network_check_dns_server_failure, network_check_storage_failure, network_check_metadata_endpoint_failure, network_check_control_plane_failure, network_check_multiple_components_failure, driver_unhealthy, security_agents_failed_initial_verification, driver_dns_resolution_failure, no_activated_k8s, usage_policy_entitlement_denied, no_activated_k8s_testing_tag, k8s_active_pod_quota_exceeded and cloud_account_pod_quota_exceeded enum values for databricks.sdk.service.sql.TerminationReasonCode.
  • [Breaking] Change create() method for a.account_metastore_assignments account-level service to start returning databricks.sdk.service.catalog.AccountsCreateMetastoreAssignmentResponse dataclass.
  • [Breaking] Change delete() method for a.account_metastore_assignments account-level service to start returning databricks.sdk.service.catalog.AccountsDeleteMetastoreAssignmentResponse dataclass.
  • [Breaking] Change update() method for a.account_metastore_assignments account-level service to start returning databricks.sdk.service.catalog.AccountsUpdateMetastoreAssignmentResponse dataclass.
  • [Breaking] Change create() method for a.account_metastores account-level service to return databricks.sdk.service.catalog.AccountsCreateMetastoreResponse dataclass.
  • [Breaking] Change delete() method for a.account_metastores account-level service to start returning databricks.sdk.service.catalog.AccountsDeleteMetastoreResponse dataclass.
  • [Breaking] Change get() method for a.account_metastores account-level service to return databricks.sdk.service.catalog.AccountsGetMetastoreResponse dataclass.
  • [Breaking] Change list() method for a.account_metastores account-level service to return databricks.sdk.service.catalog.AccountsListMetastoresResponse dataclass.
  • [Breaking] Change update() method for a.account_metastores account-level service to return databricks.sdk.service.catalog.AccountsUpdateMetastoreResponse dataclass.
  • [Breaking] Change create() method for a.account_storage_credentials account-level service to return databricks.sdk.service.catalog.AccountsCreateStorageCredentialInfo dataclass.
  • [Breaking] Change delete() method for a.account_storage_credentials account-level service to start returning databricks.sdk.service.catalog.AccountsDeleteStorageCredentialResponse dataclass.
  • [Breaking] Change update() method for a.account_storage_credentials account-level service to return databricks.sdk.service.catalog.AccountsUpdateStorageCredentialResponse dataclass.
  • [Breaking] Change create() method for w.registered_models workspace-level service with new required argument order.

... (truncated)
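The new `page_size`/`page_token` fields on `ListWarehousesRequest` and `next_page_token` on `ListWarehousesResponse` follow standard token-based pagination: thread each response's token back into the next request until the token is absent. A minimal sketch of that loop, where `list_warehouses` and its backing data are stand-in stubs (not the real `w.warehouses.list()` call, which hits the Databricks API with the same semantics):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Stand-in mirroring the shape of databricks.sdk.service.sql.ListWarehousesResponse.
@dataclass
class ListWarehousesResponse:
    warehouses: List[str] = field(default_factory=list)
    next_page_token: Optional[str] = None

# Hypothetical in-memory data standing in for the server side.
_DATA = [f"warehouse-{i}" for i in range(5)]

def list_warehouses(page_size: int, page_token: Optional[str] = None) -> ListWarehousesResponse:
    # Stub for the list endpoint: the token here is just a stringified offset.
    start = int(page_token) if page_token else 0
    page = _DATA[start:start + page_size]
    next_token = str(start + page_size) if start + page_size < len(_DATA) else None
    return ListWarehousesResponse(warehouses=page, next_page_token=next_token)

def all_warehouses(page_size: int = 2) -> List[str]:
    """Drain every page by threading next_page_token back into the request."""
    results: List[str] = []
    token: Optional[str] = None
    while True:
        resp = list_warehouses(page_size=page_size, page_token=token)
        results.extend(resp.warehouses)
        token = resp.next_page_token
        if token is None:
            return results
```

Callers that previously consumed an unpaginated list keep the same shape by collecting pages as above.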

Changelog

Sourced from databricks-sdk's changelog.

Release v0.68.0
  • [Breaking] Change delete() method for a.credentials account-level service to start returning databricks.sdk.service.provisioning.Credential dataclass.

... (truncated)
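Several of the breaking changes above are account-level methods (`create()`, `delete()`, `update()` on `a.account_metastores` and friends) that now return typed dataclasses. Callers that discarded the return value are unaffected; code that inspected a raw response must switch to attribute access. A hedged before/after sketch, where both the dataclass and the function are illustrative stand-ins rather than the real SDK classes:

```python
from dataclasses import dataclass

# Stand-in mirroring the shape of the new typed responses, e.g.
# databricks.sdk.service.catalog.AccountsDeleteMetastoreResponse.
# The real class's fields may differ; metastore_id is assumed here.
@dataclass
class AccountsDeleteMetastoreResponse:
    metastore_id: str

def delete_metastore(metastore_id: str) -> AccountsDeleteMetastoreResponse:
    # Hypothetical stub for a.account_metastores.delete(); the real method
    # performs the API call and, as of v0.68.0, returns a dataclass.
    return AccountsDeleteMetastoreResponse(metastore_id=metastore_id)

# Pre-0.68.0 callers that ignored the result keep working unchanged:
delete_metastore("abc-123")

# Post-0.68.0 callers can read typed fields off the response:
resp = delete_metastore("abc-123")
assert resp.metastore_id == "abc-123"
```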

Commits
  • 736a39a [Release] Release v0.68.0
  • aab24b0 Updated release workflow (#1060)
  • 1afb909 Fix slow unit tests for FilesExt (#1074)
  • 70d1788 Update SDK to latest API specification (#1059)
  • 21f8ff7 Add dbutils/remote to the user agent of calls made through the `RemoteDbUti...
  • 946de3e Add options for the Long Running Operation. (#1072)
  • 161c30a Add poll function for LRO which follows linear backoff with jitter. (#1065)
  • 156ec40 Add FieldMask class and helper functions (#1041)
  • ffa5653 Fix bearer tokens logged in exception messages (#1068)
  • 88f1047 Add native support for Azure DevOps OIDC authentication (#1027)
  • Additional commits viewable in compare view

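The new `update_mask` field on `UpdateWorkspaceRequest`, together with the `FieldMask` helpers added in #1041, follows the protobuf field-mask convention: only the paths named in the mask are applied, and unnamed fields are left untouched. A small sketch of that semantics, independent of the SDK (the function and the comma-separated mask format are illustrative assumptions):

```python
from typing import Any, Dict

def apply_update_mask(current: Dict[str, Any], patch: Dict[str, Any],
                      update_mask: str) -> Dict[str, Any]:
    """Apply only the top-level fields listed in a comma-separated field
    mask, mirroring the protobuf FieldMask convention."""
    allowed = {p.strip() for p in update_mask.split(",") if p.strip()}
    updated = dict(current)
    for key in allowed:
        if key in patch:
            updated[key] = patch[key]
    return updated

workspace = {"workspace_name": "old", "network": "net-a", "storage_mode": "default"}
patch = {"workspace_name": "new", "network": "net-b"}
# Only workspace_name is in the mask, so network keeps its current value.
result = apply_update_mask(workspace, patch, update_mask="workspace_name")
```

The design point is that an absent field in the patch is ambiguous (unset vs. unchanged); the mask makes the caller's intent explicit.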
Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [databricks-sdk](https://github.com/databricks/databricks-sdk-py) from 0.12.0 to 0.68.0.
- [Release notes](https://github.com/databricks/databricks-sdk-py/releases)
- [Changelog](https://github.com/databricks/databricks-sdk-py/blob/main/CHANGELOG.md)
- [Commits](databricks/databricks-sdk-py@v0.12.0...v0.68.0)

---
updated-dependencies:
- dependency-name: databricks-sdk
  dependency-version: 0.68.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added the dependencies and python labels Oct 15, 2025
@dependabot dependabot bot requested a review from a team as a code owner October 15, 2025 15:11
@dependabot dependabot bot requested a review from dmoore247 October 15, 2025 15:11
Contributor Author

dependabot bot commented on behalf of github Oct 20, 2025

Superseded by #596.

@dependabot dependabot bot closed this Oct 20, 2025
@dependabot dependabot bot deleted the dependabot/pip/databricks-sdk-0.68.0 branch October 20, 2025 16:12
